Simplest Working TensorFlow Notebook Ever


In [1]:
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
import math

Generating data with NumPy


In [2]:
# The true values of the coefficients, which we will estimate hereafter
a_true, b_true = 1.3, -0.7

In [3]:
n_sample = 1000

Let's generate the samples

$ y_i = a x_i + b + \epsilon_i $

where $\epsilon_i \sim \mathcal{N}(0,\sigma^2)$ with $\sigma = 0.1$.


In [4]:
# Generate the abscissas (x values), uniformly distributed in [0, 1)
x = np.random.rand(n_sample)

In [5]:
# Generate the corresponding ordinates, with Gaussian noise of standard deviation 0.1
y = a_true * x + b_true + np.random.randn(n_sample) * 0.1

In [6]:
plt.scatter(x, y)


Out[6]:
<matplotlib.collections.PathCollection at 0x7f604ca64278>

Finding the coefficients a and b that minimize the sum of the squared errors

Random initialization of the variables to estimate

In [7]:
a_estimated = tf.Variable(tf.truncated_normal([1]))
b_estimated = tf.Variable(tf.truncated_normal([1]))

Designing the loss function to minimize: the sum of the squared errors

In [8]:
loss = 0.0

In [9]:
for i_sample in range(n_sample):
    loss += tf.square(a_estimated * x[i_sample] + b_estimated - y[i_sample])
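
As an aside, the same loss can be written without the Python loop by letting TensorFlow operate on the whole arrays at once (a vectorized sketch, not part of the original notebook; the explicit cast to float32 matches the dtype of the variables):

# Equivalent vectorized loss: a single graph op over all samples
x32, y32 = x.astype(np.float32), y.astype(np.float32)
loss = tf.reduce_sum(tf.square(a_estimated * x32 + b_estimated - y32))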

In [10]:
train_op = tf.train.AdamOptimizer(learning_rate=0.1, epsilon=0.1).minimize(loss)
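
For reference (a sketch, not part of the original notebook), a plain gradient-descent optimizer would also work here; Adam simply converges with less tuning of the learning rate:

# Alternative: vanilla gradient descent instead of Adam
# (typically needs more iterations or a hand-tuned learning rate; 0.05 is an arbitrary choice)
train_op = tf.train.GradientDescentOptimizer(learning_rate=0.05).minimize(loss)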

Initialization of the TensorFlow session and variables

In [11]:
sess = tf.InteractiveSession()
sess.run(tf.global_variables_initializer())

Gradient descent steps to optimize the values of a_estimated and b_estimated


In [12]:
n_iter = 100
loss_list = []

In [13]:
for i in range(n_iter):
    sess.run(train_op)
    loss_list.append(sess.run(loss))
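
A small efficiency note (a sketch, not in the original notebook): the loop above could equivalently fetch the training op and the loss in a single sess.run call, which avoids evaluating the graph twice per iteration:

# Run the training step and read the loss in one call
for i in range(n_iter):
    _, loss_value = sess.run([train_op, loss])
    loss_list.append(loss_value)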

Showing the loss at each iteration


In [14]:
plt.plot(loss_list)


Out[14]:
[<matplotlib.lines.Line2D at 0x7f5fffa8f518>]

And finally, let's see what the estimated values are

In [15]:
a_b_estimated = sess.run([a_estimated, b_estimated])

In [16]:
a_b_estimated


Out[16]:
[array([ 1.29090822], dtype=float32), array([-0.68512118], dtype=float32)]

Whereas the true values are

In [17]:
a_true, b_true


Out[17]:
(1.3, -0.7)

The values are close! Great job.
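
As an extra sanity check (not part of the original notebook), NumPy's closed-form least-squares fit should give essentially the same coefficients:

# Closed-form least squares with NumPy; returns [slope, intercept]
np.polyfit(x, y, 1)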

Let's visualize the fitted affine function


In [18]:
a_est = a_b_estimated[0][0]

In [19]:
b_est = a_b_estimated[1][0]

In [20]:
y_est = a_est * x + b_est

In [21]:
plt.scatter(x, y, color="b")
plt.scatter(x, y_est, color="r")


Out[21]:
<matplotlib.collections.PathCollection at 0x7f5ffc760780>

It fits rather well.
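
As a final touch (a sketch, not in the original notebook), the fit is even easier to read when drawn as a line on top of the data:

# Draw the fitted affine function as a line over the scatter of the data
plt.scatter(x, y, color="b")
x_line = np.linspace(0, 1, 100)
plt.plot(x_line, a_est * x_line + b_est, color="r")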